
Conversation

@alpha-er

Add flexible embedding provider configuration to support both OpenAI and custom embedding endpoints (e.g., LiteLLM, local models). This lets users plug in alternative embedding services while maintaining backward compatibility with OpenAI.


Signed-off-by: ER Hapal <[email protected]>
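
A minimal sketch of what such a configuration could look like. The environment variable names (`EMBEDDING_API_BASE`, `EMBEDDING_API_KEY`, `EMBEDDING_MODEL`), defaults, and helper functions below are illustrative assumptions, not the exact names introduced by this PR; the idea is simply that an unset base URL keeps the standard OpenAI endpoint, while a custom base URL routes requests to a LiteLLM proxy or a local server.

```python
# Illustrative sketch only -- config keys and defaults are assumptions,
# not the exact names added by this PR.
import os
from openai import OpenAI

# Leave EMBEDDING_API_BASE unset to keep the default OpenAI endpoint,
# or point it at a LiteLLM proxy / local server (e.g. "http://localhost:4000/v1").
EMBEDDING_API_BASE = os.getenv("EMBEDDING_API_BASE")
EMBEDDING_API_KEY = os.getenv("EMBEDDING_API_KEY", os.getenv("OPENAI_API_KEY", ""))
EMBEDDING_MODEL = os.getenv("EMBEDDING_MODEL", "text-embedding-3-small")


def get_embedding_client() -> OpenAI:
    """Return an OpenAI-compatible client, honoring a custom base URL if set."""
    if EMBEDDING_API_BASE:
        return OpenAI(api_key=EMBEDDING_API_KEY, base_url=EMBEDDING_API_BASE)
    # Backward-compatible path: plain OpenAI with the standard endpoint.
    return OpenAI(api_key=EMBEDDING_API_KEY)


def embed(texts: list[str]) -> list[list[float]]:
    """Embed a batch of texts with whichever provider is configured."""
    client = get_embedding_client()
    response = client.embeddings.create(model=EMBEDDING_MODEL, input=texts)
    return [item.embedding for item in response.data]


if __name__ == "__main__":
    vectors = embed(["hello world"])
    print(len(vectors[0]), "dimensions")
```

Because LiteLLM and many local servers expose an OpenAI-compatible `/v1/embeddings` route, reusing the OpenAI client with a configurable `base_url` keeps the default behavior unchanged for existing OpenAI users.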
